
Collaborating Authors

 graph neural network


Mean-field theory of graph neural networks in graph partitioning

Neural Information Processing Systems

A theoretical performance analysis of the graph neural network (GNN) is presented. For classification tasks, the neural network approach has the advantage of flexibility: it can be employed in a data-driven manner, whereas Bayesian inference requires the assumption of a specific model. A fundamental question is then whether the GNN attains high accuracy in addition to this flexibility. Moreover, whether the achieved performance is predominantly a result of backpropagation or of the architecture itself is a matter of considerable interest. To gain better insight into these questions, a mean-field theory of a minimal GNN architecture is developed for the graph partitioning problem, which shows good agreement with numerical experiments.
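The question of how much the architecture alone contributes (as opposed to backpropagation) can be made concrete with a toy sketch: an untrained GNN with random weights applied to a two-group stochastic block model, followed by a spectral readout that partitions the nodes. All sizes, the block-model parameters, and the tanh layers are illustrative assumptions, not the paper's exact minimal architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-group stochastic block model (illustrative parameters).
n = 60
groups = np.repeat([0, 1], n // 2)
p_in, p_out = 0.4, 0.05
prob = np.where(groups[:, None] == groups[None, :], p_in, p_out)
A = (rng.random((n, n)) < prob).astype(float)
A = np.triu(A, 1)
A = A + A.T  # symmetric adjacency, no self-loops

# Minimal GNN layers with random, UNTRAINED weights: X <- tanh(A X W).
# This isolates what the message-passing architecture does by itself.
d = 8
X = rng.standard_normal((n, d))
for _ in range(4):
    W = rng.standard_normal((d, d)) / np.sqrt(d)
    X = np.tanh(A @ X @ W)

# Readout: partition nodes by the sign of the leading principal component.
X = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
labels = (X @ Vt[0] > 0).astype(int)
```

Comparing `labels` against `groups` (up to label permutation) gives a rough sense of how far random-weight message passing alone can go on this toy instance.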


Semi-Supervised Learning on Graphs using Graph Neural Networks

Chen, Juntong, Donnat, Claire, Klopp, Olga, Schmidt-Hieber, Johannes

arXiv.org Machine Learning

Graph neural networks (GNNs) work remarkably well in semi-supervised node regression, yet a rigorous theory explaining when and why they succeed remains lacking. To address this gap, we study an aggregate-and-readout model that encompasses several common message passing architectures: node features are first propagated over the graph then mapped to responses via a nonlinear function. For least-squares estimation over GNNs with linear graph convolutions and a deep ReLU readout, we prove a sharp non-asymptotic risk bound that separates approximation, stochastic, and optimization errors. The bound makes explicit how performance scales with the fraction of labeled nodes and graph-induced dependence. Approximation guarantees are further derived for graph-smoothing followed by smooth nonlinear readouts, yielding convergence rates that recover classical nonparametric behavior under full supervision while characterizing performance when labels are scarce. Numerical experiments validate our theory, providing a systematic framework for understanding GNN performance and limitations.
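The aggregate-and-readout model described above can be sketched in a few lines: a linear graph convolution (here, powers of a row-normalized adjacency, an assumed choice) propagates node features, then a small ReLU network maps them to responses node-wise, with least squares evaluated on the labeled fraction of nodes. Shapes, depths, and the propagation operator are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random graph with illustrative sizes.
n, d = 30, 5
A = (rng.random((n, n)) < 0.2).astype(float)
A = np.triu(A, 1)
A = A + A.T
deg = A.sum(axis=1) + 1.0
S = (A + np.eye(n)) / deg[:, None]  # row-normalized propagation operator

X = rng.standard_normal((n, d))

# Step 1: aggregate -- a linear graph convolution (two applications of S).
H = S @ (S @ X)

# Step 2: readout -- a small ReLU network applied to each node's features.
W1 = rng.standard_normal((d, 16)) / np.sqrt(d)
W2 = rng.standard_normal((16, 1)) / np.sqrt(16)
yhat = np.maximum(H @ W1, 0.0) @ W2  # one predicted response per node

# Least-squares objective on the labeled fraction of nodes only.
labeled = rng.random(n) < 0.3
y = rng.standard_normal(n)  # placeholder responses
loss = np.mean((yhat[labeled, 0] - y[labeled]) ** 2)
```

The separation between the propagation step and the node-wise readout mirrors the paper's decomposition of the risk into approximation, stochastic, and optimization errors.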


Adversarial Graph Augmentation to Improve Graph Contrastive Learning

Neural Information Processing Systems

Graph contrastive learning (GCL), by training GNNs to maximize the correspondence between the representations of the same graph in its different augmented forms, may yield robust and transferable GNNs even without using labels.
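A minimal sketch of the GCL objective: two augmented views of each graph (random edge dropping, one common augmentation) are encoded by the same GNN, and an InfoNCE-style loss scores agreement between views of the same graph against other graphs in the batch. The encoder, augmentation, and temperature below are assumptions for illustration, not this paper's specific method.

```python
import numpy as np

rng = np.random.default_rng(2)

def embed(A, X, W):
    """One-layer GNN encoder followed by mean pooling into a graph vector."""
    return np.tanh(A @ X @ W).mean(axis=0)

def drop_edges(A, rate, rng):
    """Random edge dropping -- one simple graph augmentation."""
    mask = rng.random(A.shape) > rate
    mask = np.triu(mask, 1)
    return A * (mask | mask.T)

# A small batch of random graphs (illustrative data).
graphs = []
for _ in range(4):
    A = (rng.random((10, 10)) < 0.3).astype(float)
    A = np.triu(A, 1)
    graphs.append((A + A.T, rng.standard_normal((10, 3))))

W = rng.standard_normal((3, 8)) / np.sqrt(3)

# Two augmented views per graph; the encoder should agree across views.
Z1 = np.stack([embed(drop_edges(A, 0.2, rng), X, W) for A, X in graphs])
Z2 = np.stack([embed(drop_edges(A, 0.2, rng), X, W) for A, X in graphs])

# InfoNCE-style loss: same-graph pairs are positives, the rest negatives.
Z1 = Z1 / np.linalg.norm(Z1, axis=1, keepdims=True)
Z2 = Z2 / np.linalg.norm(Z2, axis=1, keepdims=True)
sim = Z1 @ Z2.T / 0.5  # temperature 0.5 is an assumed hyperparameter
loss = -np.mean(np.diag(sim) - np.log(np.exp(sim).sum(axis=1)))
```

Minimizing `loss` over the encoder weights pulls the two views of each graph together while pushing different graphs apart, which is the correspondence-maximization idea in the abstract.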



How Powerful are K-hop Message Passing Graph Neural Networks

Neural Information Processing Systems

Recently, researchers extended 1-hop message passing to K-hop message passing by aggregating information from the K-hop neighbors of nodes simultaneously. However, no prior work analyzes the expressive power of K-hop message passing.
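K-hop message passing can be illustrated with a minimal sketch: boolean masks for the exact-k-hop neighborhoods are computed for k = 1..K, each hop is aggregated separately, and the results are combined. Sum aggregation and concatenation are assumed design choices here; K-hop architectures in the literature differ in both.

```python
import numpy as np

def khop_neighbors(A, K):
    """Boolean masks where masks[k][i, j] is True iff j is exactly k+1 hops from i."""
    n = A.shape[0]
    reach = np.eye(n, dtype=bool)  # nodes already reached within fewer hops
    masks = []
    frontier = A.astype(bool)
    for _ in range(K):
        new = frontier & ~reach    # strictly new nodes at this hop distance
        masks.append(new)
        reach |= new
        frontier = (new.astype(int) @ A.astype(int)) > 0
    return masks

def khop_message_passing(A, X, K):
    """One K-hop layer: sum-aggregate each hop separately, combine by concatenation."""
    hops = khop_neighbors(A, K)
    agg = [M.astype(float) @ X for M in hops]  # per-hop sum aggregation
    return np.concatenate([X] + agg, axis=1)

# Path graph 0-1-2-3: node 0 has a 1-hop neighbor (1) and a 2-hop neighbor (2).
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
X = np.eye(4)  # one-hot node features
H = khop_message_passing(A, X, K=2)  # own features + 1-hop + 2-hop aggregates
```

With one-hot features, row 0 of `H` reads off node 0's identity, its 1-hop neighbor {1}, and its 2-hop neighbor {2}, which a single 1-hop layer could not see at once.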